Pattern Completion in Symmetric Threshold-Linear Networks

Authors

  • Carina Curto
  • Katherine Morrison
Abstract

Threshold-linear networks are a common class of firing rate models that describe recurrent interactions among neurons. Unlike their linear counterparts, these networks generically possess multiple stable fixed points (steady states), making them viable candidates for memory encoding and retrieval. In this work, we characterize stable fixed points of general threshold-linear networks with constant external drive and discover constraints on the coexistence of fixed points involving different subsets of active neurons. In the case of symmetric networks, we prove the following antichain property: if a set of neurons σ is the support of a stable fixed point, then no proper subset or superset of σ can support a stable fixed point. Symmetric threshold-linear networks thus appear to be well suited for pattern completion, since the dynamics are guaranteed not to get stuck in a subset or superset of a stored pattern. We also show that for any graph G, we can construct a network whose stable fixed points correspond precisely to the maximal cliques of G. As an application, we design network decoders for place field codes and demonstrate their efficacy for error correction and pattern completion. The proofs of our main results build on the theory of permitted sets in threshold-linear networks, including recently developed connections to classical distance geometry.
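
For readers who want to see the model in action, the following is a minimal numerical sketch, not code from the paper: it Euler-integrates the standard threshold-linear dynamics dx/dt = -x + [Wx + b]_+ with a small symmetric weight matrix whose values are purely illustrative, and reports which neurons remain active at steady state from random initial conditions. In this toy network the stable supports are incomparable sets, so trajectories complete to one full pattern or another rather than getting stuck on a fragment, which is the behaviour the abstract describes.

```python
import numpy as np

def simulate_tln(W, b, x0, dt=0.01, T=200.0):
    """Euler-integrate the threshold-linear dynamics dx/dt = -x + [Wx + b]_+."""
    x = np.array(x0, dtype=float)
    for _ in range(int(T / dt)):
        x += dt * (-x + np.maximum(W @ x + b, 0.0))
    return x

def stable_support(W, b, x0, tol=1e-6):
    """Return the indices of neurons still active at the end of the run."""
    x = simulate_tln(W, b, x0)
    return sorted(int(i) for i in np.flatnonzero(x > tol))

# Toy symmetric weight matrix (values chosen for illustration, not from the paper):
# neurons 0-2 inhibit each other weakly; neuron 3 and the rest inhibit each other strongly.
W = np.array([[ 0.0, -0.5, -0.5, -1.5],
              [-0.5,  0.0, -0.5, -1.5],
              [-0.5, -0.5,  0.0, -1.5],
              [-1.5, -1.5, -1.5,  0.0]])
b = np.ones(4)  # constant external drive

rng = np.random.default_rng(0)
for trial in range(3):
    x0 = rng.uniform(0.0, 1.0, size=4)
    print(stable_support(W, b, x0))   # settles on {0, 1, 2} or {3}, never a fragment of either
```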

Related Articles

High Capacity, Small World Associative Memory Models
Neil Davey, Lee Calcraft, and Rod Adams

Models of associative memory usually have full connectivity or, if diluted, random symmetric connectivity. In contrast, biological neural systems have predominantly local, non-symmetric connectivity. Here we investigate sparse networks of threshold units, trained with the perceptron learning rule. The units are given position and are arranged in a ring. The connectivity graph varies between bein...
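
The perceptron learning rule mentioned in this abstract is standard enough to illustrate in a few lines. The sketch below is not the authors' code: it drops the ring geometry and sparse dilution, uses a fully connected network of ±1 threshold units, and all parameter values (learning rate, epochs, pattern count) are illustrative assumptions. It trains each unit's incoming weights so that stored patterns become fixed points, then recalls one pattern from a corrupted cue.

```python
import numpy as np

def train_perceptron_memory(patterns, epochs=100, eta=0.1):
    """Make each pattern (entries in {-1, +1}) a fixed point of
    s_i <- sign(sum_j W_ij s_j), using the standard perceptron correction
    on each unit's incoming weights. Self-connections are kept at zero."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        for xi in patterns:
            h = W @ xi                           # local field of every unit
            wrong = np.sign(h) != xi             # units that misclassify this pattern
            W[wrong] += eta * np.outer(xi[wrong], xi)
            np.fill_diagonal(W, 0.0)
    return W

# Store two random patterns, then recall one from a corrupted cue.
rng = np.random.default_rng(1)
patterns = np.sign(rng.standard_normal((2, 20)))
W = train_perceptron_memory(patterns)

cue = patterns[0].copy()
cue[:3] *= -1                                    # flip a few bits
for _ in range(10):
    cue = np.sign(W @ cue)                       # synchronous threshold updates
print(bool(np.array_equal(cue, patterns[0])))    # True if the stored pattern was recovered
```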

Pattern retrieval in threshold-linear associative nets.

Networks of threshold-linear neurons have previously been introduced and analysed as distributed associative memory systems. Here, results from simulations of pattern retrieval in a large-scale, sparsely connected network are presented. The storage capacity lies near a = 0.8 and 1.2 for binary and ternary patterns respectively, in reasonable accordance with theoretical estimates. The system is ...

Stability of the replica symmetric solution for the information conveyed by a neural network

The information that a pattern of firing in the output layer of a feedforward network of threshold-linear neurons conveys about the network’s inputs is considered. A replica-symmetric solution is found to be stable for all but small amounts of noise. The region of instability depends on the contribution of the threshold and the sparseness: for distributed pattern distributions, the unstable reg...

Permitted and Forbidden Sets in Symmetric Threshold-Linear Networks

The richness and complexity of recurrent cortical circuits is an inexhaustible source of inspiration for thinking about high-level biological computation. In past theoretical studies, constraints on the synaptic connection patterns of threshold-linear networks were found that guaranteed bounded network dynamics, convergence to attractive fixed points, and multistability, all fundamental aspects...
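
The permitted/forbidden-set theory referenced here, and used in the main paper's proofs, has a simple computational face for symmetric networks: a subset of neurons is permitted exactly when the corresponding principal submatrix of -I + W is stable. The sketch below, with an assumed toy weight matrix rather than anything from either paper, enumerates the permitted subsets of a three-neuron network by checking that eigenvalue condition.

```python
import numpy as np
from itertools import combinations

def is_permitted(W, sigma):
    """Treat sigma as permitted when the principal submatrix of (-I + W)
    restricted to sigma has only negative eigenvalues, i.e. the linearization
    on that set of coactive neurons is stable (symmetric-network criterion)."""
    idx = np.array(sigma)
    A = (-np.eye(len(W)) + W)[np.ix_(idx, idx)]
    return float(np.max(np.linalg.eigvalsh(A))) < 0.0

# Symmetric toy matrix, zero diagonal; the strong mutual excitation between
# neurons 0 and 2 is there to create a forbidden pair (values are illustrative).
W = np.array([[0.0, 0.3, 1.5],
              [0.3, 0.0, 0.3],
              [1.5, 0.3, 0.0]])

n = len(W)
permitted = [sigma for k in range(1, n + 1)
             for sigma in combinations(range(n), k)
             if is_permitted(W, sigma)]
print(permitted)
```

With these illustrative values the strongly excitatory pair {0, 2} fails the test, and so does the full set containing it, consistent with permitted sets of symmetric networks being closed under taking subsets.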

Journal:
  • Neural Computation

Volume 28, Issue 12

Pages: -

Publication date: 2016